ISSN 0439-755X
CN 11-1911/B

Acta Psychologica Sinica, 2026, Vol. 58, Issue (2): 308-322. doi: 10.3724/SP.J.1041.2026.0308

• Reports of Empirical Studies •

“Zero-Shot Language Learning”: Can Large Language Models “Acquire” Contextual Emotion Like Humans?

WU Shiyu, WANG Yiyun

  1. School of Foreign Languages, Shanghai Jiao Tong University, Shanghai 200240, China
  • Published: 2026-02-25  Online: 2025-12-03
  • Contact: WU Shiyu, E-mail: shiyuw@sjtu.edu.cn

Abstract:

This study examines whether large language models (LLMs) can, under “zero-shot” conditions, acquire the contextual emotion of the discourse in which novel words appear through reading-based incidental vocabulary learning, and evaluates how context emotionality (positive/neutral/negative) and context variability (repeated vs. varied) jointly influence lexical learning. In a human-model comparison paradigm, four LLMs and three groups of human learners studied a shared set of materials: target pseudowords embedded in contexts that differed in emotionality and variability. Multiple post-tests assessed the transfer of affect and the learning of word forms and meanings. Results showed that LLMs, like human learners, successfully transferred the affective content of the context to the target words and maintained emotional consistency in language production. Both LLMs and humans exhibited a “positivity advantage” and a “context variability advantage,” and an interaction between context emotionality and context variability emerged in the definition-generation task. We propose a “dual-mechanism framework,” arguing that LLMs display human-like emotional semantic learning at the functional level, but that their underlying mechanism, rooted in statistical co-occurrence and vector-space optimization, is fundamentally different from humans’ embodied and socially grounded processing. The findings have implications for affective computing, the ethics of human-AI interaction, and vocabulary instruction.
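To make the “zero-shot” exposure-then-test paradigm concrete, the sketch below illustrates one way such a learning cycle could be scripted for an LLM. It is not the authors’ code: the pseudoword, the example contexts, the prompt wording, the 1-9 valence scale, and the query_llm stub are all illustrative assumptions, and the study’s actual materials, models, and scoring procedures are not reproduced here.

from statistics import mean

# Hypothetical pseudoword and one learning condition (positive emotionality
# x varied contexts); the study's real items and passages are not listed here.
PSEUDOWORD = "blick"
VARIED_POSITIVE_CONTEXTS = [
    f"The children laughed with delight when they finally saw the {PSEUDOWORD}.",
    f"Winning the {PSEUDOWORD} was the proudest moment of her career.",
    f"He smiled warmly, grateful for the {PSEUDOWORD} his friends had made.",
]

def query_llm(prompt: str) -> str:
    """Stub standing in for a real LLM API call; returns a fixed answer so
    the sketch runs end to end. Swap in an actual client to run the paradigm."""
    return "7"

def affect_transfer_score(contexts: list[str], n_samples: int = 5) -> float:
    """Expose the model to emotional contexts containing the pseudoword
    (with no explicit emotion labels, hence 'zero-shot'), then elicit a
    valence rating for the bare word and average over repeated samples."""
    exposure = "\n".join(contexts)
    prompt = (
        f"Read these sentences:\n{exposure}\n\n"
        f"On a scale from 1 (very negative) to 9 (very positive), how does "
        f"the word '{PSEUDOWORD}' feel to you? Answer with a single number."
    )
    return mean(float(query_llm(prompt)) for _ in range(n_samples))

# A rating near 9 for positive contexts (and near 1 for negative ones)
# would indicate transfer of contextual affect to the novel word.
print(affect_transfer_score(VARIED_POSITIVE_CONTEXTS))

Under this reading, the repeated-context condition would simply reuse one passage for every exposure, and the human groups would receive the same passages in a reading task followed by parallel valence ratings and form/meaning post-tests.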

Key words: Large Language Models, zero-shot learning, emotion learning